Discriminatively trained sparse inverse covariance matrices for low resource acoustic modeling

Authors

  • Weibin Zhang
  • Pascale Fung
Abstract

We propose a method to discriminatively train acoustic models with sparse inverse covariance (precision) matrices in order to improve model robustness when training data is insufficient. Acoustic models with sparse inverse covariance matrices were previously proposed to address the problem of over-fitting when training data is inadequate: since many entries of the inverse covariance matrices are driven to zero, the number of free parameters to be estimated is reduced. However, acoustic models with sparse inverse covariance matrices were previously trained using maximum likelihood (ML) estimation, and it is well known that discriminative training can further improve recognition accuracy. Therefore, for the first time, we study the problem of training acoustic models with sparse inverse covariance matrices using discriminative training. An L1 regularization term is added to the traditional discriminative training objective function to penalize complex models and to automatically sparsify the inverse covariance matrices. The new objective function is optimized by maximizing a weak-sense auxiliary function. Experimental results on the Wall Street Journal data set show that our method effectively regularizes model complexity and allows more Gaussian components to be trained, so it can better model the non-Gaussian nature of speech feature vectors. Compared with the standard maximum mutual information (MMI) training method, our proposed method significantly improves recognition accuracy.
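
A minimal sketch of the general form of such an L1-regularized discriminative objective (the exact weighting, the set of penalized precision entries, and the symbols F, P_jm and rho are illustrative assumptions, not taken from the paper):

\mathcal{F}(\lambda) = \mathcal{F}_{\mathrm{MMI}}(\lambda) - \rho \sum_{j,m} \lVert P_{jm} \rVert_{1},
\qquad
\mathcal{F}_{\mathrm{MMI}}(\lambda) = \sum_{r=1}^{R} \log
\frac{p_{\lambda}(O_r \mid \mathcal{M}_{w_r})\, P(w_r)}
     {\sum_{w} p_{\lambda}(O_r \mid \mathcal{M}_{w})\, P(w)},

where P_{jm} is the precision matrix of Gaussian component m in state j, O_r and w_r are the r-th training utterance and its transcription, and rho >= 0 trades discrimination against sparsity: a larger rho drives more precision entries exactly to zero.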


Similar articles

Sparse Banded Precision Matrices for Low Resource Speech Recognition

We propose to use sparse banded precision matrices for speech recognition when there is insufficient training data. We previously proposed a method to drive the precision matrices towards a sparse structure during training under the HMM framework. The recognition accuracy of this compact model was shown to be better than that of full-covariance or diagonal-covariance systems. In this paper we propose to mod...
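
As an illustration of the banded constraint (the dimension d and band width b below are hypothetical, chosen only to show the structure), a banded precision matrix keeps only entries within b positions of the diagonal:

P_{ij} = 0 \quad \text{whenever } |i - j| > b,
\qquad \text{e.g. } d = 5,\; b = 1:\quad
P = \begin{pmatrix}
p_{11} & p_{12} & 0 & 0 & 0 \\
p_{12} & p_{22} & p_{23} & 0 & 0 \\
0 & p_{23} & p_{33} & p_{34} & 0 \\
0 & 0 & p_{34} & p_{44} & p_{45} \\
0 & 0 & 0 & p_{45} & p_{55}
\end{pmatrix},

so the free covariance parameters per Gaussian drop from d(d+1)/2 for a full matrix to (b+1)d - b(b+1)/2, with b = 0 recovering a diagonal-covariance system.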

Factored sparse inverse covariance matrices

Most HMM-based speech recognition systems use Gaussian mixtures as observation probability density functions. An important goal in all such systems is to improve parsimony. One method is to adjust the type of covariance matrices used. In this work, factored sparse inverse covariance matrices are introduced. Based on a UDU′ factorization, the inverse covariance matrix can be represented using line...
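
A sketch of the factorization referred to above (the transpose convention and the sparsity pattern of U are assumptions; the snippet is truncated before the details):

K = \Sigma^{-1} = U D U^{\top},

where U is unit upper-triangular and D is diagonal with positive entries. The off-diagonal entries of U act like linear prediction coefficients between feature dimensions, so forcing chosen entries of U to zero yields a sparse, still positive-definite inverse covariance with fewer free parameters.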

A Well-Conditioned and Sparse Estimation of Covariance and Inverse Covariance Matrices Using a Joint Penalty

We develop a method for estimating well-conditioned and sparse covariance and inverse covariance matrices from a sample of vectors drawn from a sub-Gaussian distribution in a high-dimensional setting. The proposed estimators are obtained by minimizing a quadratic loss function plus a joint penalty on the ℓ1 norm and on the variance of the eigenvalues. In contrast to some of the existing methods of covariance...
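
Formalizing that description as a sketch (the symbols lambda and gamma, the off-diagonal restriction, and the constraint set are assumptions; the snippet is truncated before the comparison), the covariance estimator minimizes a quadratic loss plus the joint penalty:

\hat{\Sigma} = \arg\min_{\Sigma \succ 0}\;
\lVert \Sigma - S \rVert_F^{2}
+ \lambda \sum_{i \neq j} \lvert \Sigma_{ij} \rvert
+ \gamma \sum_{i=1}^{p} \bigl( \sigma_i(\Sigma) - \bar{\sigma}(\Sigma) \bigr)^{2},

where S is the sample covariance matrix, sigma_i(Sigma) are the eigenvalues of Sigma and sigma-bar their mean; the ℓ1 term induces sparsity while the eigenvalue-variance term keeps the estimate well-conditioned. An analogous program in Sigma^{-1} gives the sparse inverse covariance estimate.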

JPEN Estimation of Covariance and Inverse Covariance Matrix: A Well-Conditioned and Sparse Estimation of Covariance and Inverse Covariance Matrices Using a Joint Penalty

We develop a method for estimating well-conditioned and sparse covariance and inverse covariance matrices from a sample of vectors drawn from a sub-Gaussian distribution in a high-dimensional setting. The proposed estimators are obtained by minimizing a quadratic loss function plus a joint penalty on the ℓ1 norm and on the variance of the eigenvalues. In contrast to some of the existing methods of covariance...

Modeling with a subspace constraint on inverse covariance matrices

We consider a family of Gaussian mixture models for use in HMM-based speech recognition systems. These "SPAM" models have state-independent choices of subspaces to which the precision (inverse covariance) matrices and means are restricted to belong. They provide a flexible tool for robust, compact, and fast acoustic modeling. The focus of this paper is on the case where the means are unconstrain...
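
A sketch of the subspace constraint on the precision matrices in such a model (the basis size K and the symbols below are illustrative; the snippet is truncated before the authors' exact setup):

P_g = \Sigma_g^{-1} = \sum_{k=1}^{K} \lambda_g^{k} S_k, \qquad P_g \succ 0,

where the basis matrices S_1, \dots, S_K are shared across all states (state-independent) and only the mixing coefficients lambda_g^k are specific to Gaussian g; choosing K far smaller than d(d+1)/2 gives a model that is more compact than full covariance yet richer than diagonal covariance.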



Publication date: 2013